An Upper Bound on the Bayesian Error
Authors
Abstract
In the Bayesian framework, predictions for a regression problem are expressed in terms of a distribution of output values. The mode of this distribution corresponds to the most probable output, while the uncertainty associated with the predictions can conveniently be expressed in terms of error bars. In this paper we consider the evaluation of error bars for the class of generalized linear regression models. We provide insight into how the error bars depend on the locations of the data points, and we derive an upper bound on the true error bars in terms of contributions from individual data points, each of which is easily evaluated.
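The abstract does not reproduce the derivation, so the following Python sketch is only an illustration of the setting: a Bayesian generalized linear model with fixed basis functions, where the error bar at an input follows from the posterior weight covariance. The Gaussian basis, the hyperparameters alpha and beta, the data, and the particular single-point bound (keep one data point's contribution to the posterior precision and drop the rest, which can only enlarge the predictive variance) are assumptions for illustration, not necessarily the bound derived in the paper.

```python
import numpy as np

# A minimal sketch of Bayesian error bars for a generalized linear
# regression model y(x) = w . phi(x), assuming a Gaussian prior
# w ~ N(0, alpha^{-1} I) and Gaussian noise with precision beta.
# The Gaussian basis, hyperparameter values, and data are illustrative
# choices, not taken from the paper.

alpha, beta = 0.1, 25.0             # prior and noise precisions (assumed)
centres = np.linspace(0.0, 1.0, 9)  # centres of the Gaussian basis functions

def phi(x, width=0.1):
    """Gaussian basis functions evaluated at a scalar input x."""
    return np.exp(-0.5 * ((x - centres) / width) ** 2)

rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 1.0, size=15)
Phi = np.stack([phi(x) for x in x_train])  # N x M design matrix

# Posterior weight covariance is A^{-1} with A = alpha I + beta Phi^T Phi,
# so the error bars depend on the data only through the basis vectors.
A = alpha * np.eye(len(centres)) + beta * Phi.T @ Phi
A_inv = np.linalg.inv(A)

def error_bar(x):
    """Predictive standard deviation (error bar) at input x."""
    p = phi(x)
    return np.sqrt(1.0 / beta + p @ A_inv @ p)

def single_point_bound(x):
    """Upper bound on the error bar built from one data point at a time.

    A - (alpha I + beta phi_n phi_n^T) is positive semidefinite for every
    n, so each single-point posterior over-estimates phi^T A^{-1} phi;
    the minimum over n is the tightest bound of this form. Each term is
    cheap to evaluate via the Sherman-Morrison formula.
    """
    p = phi(x)
    terms = []
    for pn in Phi:
        # (alpha I + beta pn pn^T)^{-1} p  by Sherman-Morrison
        An_inv_p = (p - beta * pn * (pn @ p) / (alpha + beta * pn @ pn)) / alpha
        terms.append(p @ An_inv_p)
    return np.sqrt(1.0 / beta + min(terms))

for x in (0.2, 0.5, 0.95):
    assert error_bar(x) <= single_point_bound(x) + 1e-12
    print(f"x={x:.2f}  error bar={error_bar(x):.4f}  bound={single_point_bound(x):.4f}")
```

The assertion checks the positive semidefinite ordering argument numerically: every single-point error bar dominates the true one, so their minimum is still a valid, easily evaluated upper bound.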
Similar Papers
Algebraic Nonlinearity in Volterra-Hammerstein Equations
Here an a posteriori error estimate for the numerical solution of nonlinear Volterra-Hammerstein equations is given. We present an error upper bound for nonlinear Volterra-Hammerstein integral equations in which the nonlinearity is algebraic, and develop an a posteriori error estimate for the recently proposed method of Brunner for these problems (the implicitly linear collocation method)...
A Sharp Sufficient Condition for Sparsity Pattern Recovery
A sufficient number of noisy linear measurements for exact and approximate sparsity pattern/support set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, considerable gaps remain between those results and the exact limits of perfect support set recovery. To reduce this gap, in this paper, the sufficient con...
A New Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications
The existing upper and lower bounds between entropy and error are mostly derived by means of inequalities, without reference to joint distributions. In fact, from both theoretical and application viewpoints, there is a need for a complete set of interpretations of the bounds in relation to joint distributions. For this reason, in this work we propose a new approach of deriving the bo...
An Upper Bound on the First Zagreb Index in Trees
In this paper we give sharp upper bounds on the Zagreb indices and characterize all trees achieving equality in these bounds. We also give a lower bound on the first Zagreb coindex of trees.
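For readers unfamiliar with the quantity in the entry above, the first Zagreb index of a graph is the sum of the squared vertex degrees. The sketch below checks the classical sharp bounds for trees on n vertices, 4n - 6 <= M1 <= n(n - 1), attained by the path and the star respectively; the specific bounds of the paper above are not reproduced here, and the example trees are arbitrary.

```python
# Illustrative check of the classical sharp bounds on the first Zagreb
# index for trees (not the exact theorem of the paper above): for a tree
# on n >= 2 vertices, 4n - 6 <= M1 <= n(n - 1), with the path attaining
# the lower bound and the star attaining the upper bound.

def first_zagreb_index(edges):
    """M1(G) = sum over vertices of deg(v)^2, computed from an edge list."""
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    return sum(d * d for d in degree.values())

n = 7
path = [(i, i + 1) for i in range(n - 1)]   # path P_n
star = [(0, i) for i in range(1, n)]        # star K_{1,n-1}
other = [(0, 1), (1, 2), (2, 3), (1, 4), (2, 5), (2, 6)]  # arbitrary tree

for name, tree in [("path", path), ("star", star), ("tree", other)]:
    m1 = first_zagreb_index(tree)
    assert 4 * n - 6 <= m1 <= n * (n - 1)
    print(f"{name}: M1 = {m1}, bounds [{4 * n - 6}, {n * (n - 1)}]")
```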
An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications
In this work, we propose a new approach of deriving the bounds between entropy and error from a joint distribution by means of optimization. The specific case study is given for binary classifications. Two basic types of classification errors are investigated, namely Bayesian and non-Bayesian errors. The consideration of non-Bayesian errors is due to the fact that most classifiers res...
Analytical Bounds between Entropy and Error Probability in Binary Classifications
The existing upper and lower bounds between entropy and error probability are mostly derived from inequalities on the entropy relations, which can introduce approximations into the analysis. We derive analytical bounds based on the closed-form solutions of conditional entropy without involving any approximation. Two basic types of classification errors are investigated in the context of bin...
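Several of the entries above concern bounds between entropy and Bayes error in binary classification. As an illustration of the classical Fano-style relations, rather than any one paper's exact result, the sketch below computes the Bayes error e_B and conditional entropy H(Y|X) from an arbitrary joint distribution and verifies H(Y|X) <= H_b(e_B) and e_B <= H(Y|X)/2, where H_b is the binary entropy; the joint distribution is made up for the example.

```python
import math

def hb(p):
    """Binary entropy in bits, with the convention 0 log 0 = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Joint distribution p(x, y) over a three-valued X and binary Y;
# rows are indexed by x, columns by y. Values are arbitrary.
joint = [
    [0.20, 0.05],
    [0.10, 0.25],
    [0.15, 0.25],
]

# Bayes error: predict the more probable class at each x.
bayes_error = sum(min(row) for row in joint)

# Conditional entropy H(Y|X) = sum_x p(x) H_b(p(y=1|x)).
cond_entropy = sum((row[0] + row[1]) * hb(row[1] / (row[0] + row[1]))
                   for row in joint)

# Fano-style sandwich relating conditional entropy and Bayes error:
# Jensen's inequality on the concave H_b gives H(Y|X) <= H_b(e_B), and
# H_b(p) >= 2p on [0, 1/2] gives e_B <= H(Y|X) / 2.
assert cond_entropy <= hb(bayes_error) + 1e-12
assert bayes_error <= cond_entropy / 2 + 1e-12
print(f"e_B = {bayes_error:.3f}, H(Y|X) = {cond_entropy:.3f}, "
      f"H_b(e_B) = {hb(bayes_error):.3f}")
```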